Two-phase quasi-Newton method for unconstrained optimization problem
Authors
Abstract
Similar Resources
On the Behavior of Damped Quasi-Newton Methods for Unconstrained Optimization
We consider a family of damped quasi-Newton methods for solving unconstrained optimization problems. This family resembles that of Broyden with line searches, except that the change in gradients is replaced by a certain hybrid vector before updating the current Hessian approximation. This damped technique modifies the Hessian approximations so that they are maintained sufficiently positive defi...
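The damping idea described above can be sketched as follows. This is a minimal illustration of Powell-style damping in a single BFGS update, not the specific hybrid vector of the paper; the function name, the threshold parameter `phi`, and the update form are assumptions for illustration:

```python
import numpy as np

def damped_bfgs_update(B, s, y, phi=0.2):
    """One damped BFGS update of the Hessian approximation B.

    Powell-style damping replaces the gradient difference y by a hybrid
    vector r = theta*y + (1 - theta)*B s chosen so that s^T r stays
    sufficiently positive, which keeps the updated matrix positive definite.
    """
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    # Damping parameter: enforce s^T r >= phi * s^T B s.
    if sy >= phi * sBs:
        theta = 1.0
    else:
        theta = (1.0 - phi) * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)
```

When the curvature condition already holds, `theta = 1` and the update reduces to the standard BFGS formula; otherwise the hybrid vector pulls `y` toward `B s` just enough to preserve positive definiteness.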
Regularized Newton method for unconstrained convex optimization
We introduce the regularized Newton method (rnm) for unconstrained convex optimization. For any convex function, with a bounded optimal set, the rnm generates a sequence that converges to the optimal set from any starting point. Moreover the rnm requires neither strong convexity nor smoothness properties in the entire space. If the function is strongly convex and smooth enough in the neighborho...
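A single step of a generic regularized Newton iteration can be sketched as below. The fixed regularization weight `mu` and the function signature are illustrative assumptions, not the rnm rule of the paper, which adapts its regularization:

```python
import numpy as np

def regularized_newton_step(grad, hess, x, mu=1e-2):
    """One regularized Newton step: solve (H + mu*I) d = -g.

    The mu*I term keeps the linear system well posed even when the
    Hessian H is singular or indefinite, so the step itself needs
    neither strong convexity nor a positive definite Hessian.
    """
    g = grad(x)
    H = hess(x)
    d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
    return x + d
```

For a strongly convex quadratic this step contracts toward the minimizer; for degenerate problems the regularization merely slows the step rather than letting it blow up.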
Reduced-Hessian Quasi-Newton Methods for Unconstrained Optimization
Quasi-Newton methods are reliable and efficient on a wide range of problems, but they can require many iterations if the problem is ill-conditioned or if a poor initial estimate of the Hessian is used. In this paper, we discuss methods designed to be more efficient in these situations. All the methods to be considered exploit the fact that quasi-Newton methods accumulate approximate second-deri...
Improved Damped Quasi-Newton Methods for Unconstrained Optimization∗
Recently, Al-Baali (2014) has extended the damped technique in the modified BFGS method of Powell (1978) for Lagrange functions in constrained optimization to the Broyden family of quasi-Newton methods for unconstrained optimization. Appropriate choices for the damped parameter, which maintain the global and superlinear convergence property of these methods on convex functions and correct the Hess...
New quasi-Newton methods for unconstrained optimization problems
Many methods for solving minimization problems are variants of Newton's method, which requires the specification of the Hessian matrix of second derivatives. Quasi-Newton methods are intended for the situation where the Hessian is expensive or difficult to calculate. Quasi-Newton methods use only first derivatives to build an approximate Hessian over a number of iterations. This approximation is u...
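The gradient-only Hessian-building idea can be illustrated with a minimal BFGS loop. This is a standard textbook scheme, not any of the new methods proposed in the paper; the backtracking constants and tolerances are assumptions:

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimal BFGS: build an inverse-Hessian approximation H from
    gradient differences only, with no second derivatives evaluated."""
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)                      # initial inverse-Hessian guess
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        # Backtracking (Armijo) line search.
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition: keep H pos. def.
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

The update skips iterations where the curvature condition `s^T y > 0` fails, a simple safeguard that the damped methods above replace with a more principled correction.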
Journal
Journal title: Afrika Matematika
Year: 2019
ISSN: 1012-9405, 2190-7668
DOI: 10.1007/s13370-019-00680-5